"Safety isn't just a statistic; it's a feeling you hold when you're walking down the street."
The 2023 City Performance Survey — the most recent available — recorded the lowest safety satisfaction since the survey began in 1996. Only 63% of residents reported feeling safe walking during the day, down from 85% in 2019. At night, only 36% feel safe, down from 53%. Meanwhile, SFPD data shows reported crime has been declining. This is the perception gap.
The 2025 CityBeat poll from the SF Chamber of Commerce suggests perception may be starting to shift — 78% of weekly downtown visitors now report feeling safe during the day, and the share saying SF is "on the right track" nearly doubled, from 22% to 43%. But we can only see these shifts through expensive one-off polls. Public Safety Pulse would make this signal continuous and block-level.
Source: SF Chamber of Commerce CityBeat 2025 Poll. All figures from published poll results.
This map combines all data sources — 311 disorder reports, SFPD incidents, and traffic crashes — weighted by their impact on perception. Encampments and violent crime weigh more heavily than graffiti or property crime. Press play to watch hotspots shift through six 4-hour windows across a typical day.
The SPI estimates how safe each neighborhood feels by fusing 715,182 records from three geocoded city datasets (311 requests, SFPD incidents, traffic crashes), plus a citywide Reddit sentiment baseline. It is a proxy estimate, not a direct measurement of perception. Scale: 0 (least safe feeling) to 100 (safest). Phase 1 would calibrate these scores against real sentiment.
Positive = more disorder than crime (perception problem). Negative = more crime than disorder (hidden risk).
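The gap metric is not given an explicit formula here; a minimal sketch, assuming it is simply the difference of the z-scored disorder density (D) and crime density (C) that feed the SPI:

```python
def divergence(d_z: float, c_z: float) -> float:
    """Disorder-crime divergence for one neighborhood.

    Assumes the metric is the difference of the z-scored disorder
    density (D) and crime density (C) used by the SPI.
    Positive -> disorder outpaces crime (perception problem);
    negative -> crime outpaces disorder (hidden risk).
    """
    return d_z - c_z

# Hypothetical neighborhood with high disorder, moderate crime:
print(divergence(1.8, 0.4))  # positive -> perception problem
```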
How 311 report rates shift across six 4-hour windows. Darker cells = more reports per month. This is the variation that a biennial survey cannot see.
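The 4-hour windowing behind this heatmap is straightforward to sketch; the window labels and sample timestamps below are illustrative, not the project's actual binning code:

```python
from collections import Counter
from datetime import datetime

def window_of(ts: datetime) -> str:
    """Map a timestamp to one of six 4-hour windows (00-04, 04-08, ...)."""
    start = (ts.hour // 4) * 4
    return f"{start:02d}:00-{(start + 4) % 24:02d}:00"

# Tally hypothetical 311 report timestamps per window:
reports = [datetime(2025, 6, 1, 2), datetime(2025, 6, 1, 14), datetime(2025, 6, 1, 15)]
counts = Counter(window_of(t) for t in reports)
print(counts)  # '12:00-16:00' tallies 2 reports, '00:00-04:00' tallies 1
```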
| Dataset | Records | Period | SPI Role | Weight |
|---|---|---|---|---|
| 311 Service Requests vw6y-z8j6 | 626,911 | Feb 2025–Feb 2026 | Disorder density, salience, temporal | 30% + 15% + 15% |
| SFPD Incidents wg3w-h783 | 85,804 | Feb 2025–Feb 2026 | Crime severity density | 20% |
| Traffic Crashes ubvf-ztfx | 2,467 | 12 months | Pedestrian safety | 10% |
| Reddit r/sanfrancisco | 319 | Various | Community sentiment baseline | 10% |
| City Survey 2023 | — | 2023 | Validation reference | — |
| CityBeat 2025 Poll | — | 2025 | Calibration reference | — |
SPI = 100 − scaled( 0.30×D + 0.20×C + 0.15×DC + 0.10×PS + 0.15×TR + 0.10×CS )
D = z-score( Σ(311 cases × e^(-days/180)) / area_km² )
C = z-score( Σ(SFPD incidents × severity × e^(-days/180)) / area_km² )
DC = z-score( mean salience weight of 311 categories )
PS = z-score( traffic crashes / area_km² )
TR = z-score( 0.6 × night_ratio + 0.4 × trend_ratio )
CS = keyword sentiment from Reddit (citywide baseline, neighborhood-specific where available)
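The combination step above can be sketched as follows, assuming each component has already been z-scored (and capped at ±3σ) per neighborhood, and that scaled() min-max rescales the weighted sum to 0–100 across all neighborhoods — that rescaling rule is an assumption, not the published definition:

```python
# Sketch of the SPI combination step. Component z-scores are inputs;
# scaled() is assumed to be a min-max rescale across neighborhoods.
WEIGHTS = {"D": 0.30, "C": 0.20, "DC": 0.15, "PS": 0.10, "TR": 0.15, "CS": 0.10}

def weighted_burden(components: dict) -> float:
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)

def spi_scores(neighborhoods: dict) -> dict:
    """neighborhoods: name -> component z-scores; returns name -> SPI (0-100)."""
    burdens = {name: weighted_burden(c) for name, c in neighborhoods.items()}
    lo, hi = min(burdens.values()), max(burdens.values())
    span = (hi - lo) or 1.0
    # Higher weighted burden -> lower SPI (100 = safest-feeling).
    return {name: 100 - 100 * (b - lo) / span for name, b in burdens.items()}

demo = spi_scores({
    "A": {"D": 0.5, "C": -0.2, "DC": 0.1, "PS": 0.0, "TR": 0.3, "CS": -0.1},
    "B": {"D": -1.0, "C": -0.8, "DC": -0.4, "PS": 0.2, "TR": -0.5, "CS": 0.0},
})
print(demo)  # B (lower burden) scores higher than A
```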
| Choice | Approach | Rationale |
|---|---|---|
| Normalization | Z-score, capped at ±3σ | Handles different scales; robust to outliers vs naive min-max |
| Temporal decay | Exponential, e^(−days/180) (~6-month time constant) | Recent events affect current perception more than old ones |
| Crime weighting | Severity scale (1–5) | Violent crime affects perception more than property crime |
| Disorder salience | Category weights (0.2–1.0) | Encampments affect perception more than graffiti (broken windows) |
| Spatial unit | SF Planning neighborhoods | Matches City Survey for validation; ~40 zones |
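The normalization and temporal-decay rows can be sketched directly from the table and the density formulas above:

```python
import math

def decay_weight(days_ago: float, tau: float = 180.0) -> float:
    """Exponential recency weight e^(-days/tau); tau = 180 days (~6 months)."""
    return math.exp(-days_ago / tau)

def capped_z(x: float, mean: float, std: float, cap: float = 3.0) -> float:
    """Z-score capped at +/-cap standard deviations to damp outliers."""
    if std == 0:
        return 0.0
    z = (x - mean) / std
    return max(-cap, min(cap, z))

print(round(decay_weight(0), 3))    # 1.0 -- today's event carries full weight
print(round(decay_weight(180), 3))  # 0.368 -- down to 1/e after 180 days
print(capped_z(1000.0, 10.0, 5.0))  # 3.0 -- extreme outlier clipped to the cap
```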
Reporting bias: 311 reflects who reports, not what exists. Engaged neighborhoods over-report.
Survival bias: Areas people avoid generate fewer data points and appear safer than they feel.
No direct perception: Everything here is inferred from proxy data. Phase 1 collects the real signal.
Reddit signal: Only 319 posts, not geocoded — applied as crude baseline, not precise neighborhood signal.
Weight calibration: Component weights are from research literature, not empirically calibrated to SF perception data.
Phase 1 would enable proper calibration via regression against direct sentiment responses.
Wilson & Kelling (1982) "Broken Windows" — visible disorder signals predict perceived unsafety.
Sampson & Raudenbush (1999) — systematic observation of disorder correlates with fear of crime.
Salesses, Schechtner & Hidalgo (2013) "Place Pulse" — crowdsourced urban perception mapping (MIT Media Lab).
Naik et al. (2014) "Streetscore" — computer vision for perceived safety from Google Street View (MIT).
These inform our salience weighting: encampments and visible waste are weighted higher than graffiti or noise.
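The salience weighting can be sketched as a category-to-weight lookup feeding the DC component. The 0.2–1.0 scale comes from the design table; the specific categories and per-category values below are illustrative assumptions, not the project's published weights:

```python
# Illustrative salience weights on the 0.2-1.0 scale from the design table.
SALIENCE = {
    "Encampment": 1.0,
    "Human or Animal Waste": 0.9,
    "Abandoned Vehicle": 0.5,
    "Graffiti": 0.3,
    "Noise": 0.2,
}

def disorder_character(case_categories) -> float:
    """DC input: mean salience weight of a neighborhood's 311 cases."""
    weights = [SALIENCE.get(c, 0.5) for c in case_categories]  # assumed 0.5 default
    return sum(weights) / len(weights) if weights else 0.0
```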
311 captures what people report. Crime data captures what police file. Areas people avoid appear safe in the data. Reddit captures what the online community discusses. None of this is a direct measurement of how people feel. We need the actual signal.
| What We Have Now (Proxy) | What Phase 1 Adds (Direct) |
|---|---|
| 311 complaints — lagging, reporter bias | Direct, in-the-moment perception |
| Crime incidents — only what gets reported | Real-time safety sentiment |
| Biennial survey — 2-year lag, neighborhood level | Daily signal, block level |
| SPI from proxy data — inferred, unvalidated | Calibrated index with ground truth |
| Reddit keywords — crude, not geocoded | Geocoded NPS from digital touchpoints |
"Right now, how does the surrounding area feel to you?" — Comfortable / Neutral / Uncomfortable. Delivered through existing digital touchpoints during normal daily activity. Anonymous. Aggregated by place and time.
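The "aggregated by place and time" step can be sketched as below. The numeric mapping of answers, the net score, and the minimum-count suppression threshold are illustrative assumptions, not a spec:

```python
from collections import defaultdict

VALUE = {"Comfortable": 1, "Neutral": 0, "Uncomfortable": -1}
MIN_RESPONSES = 5  # suppress cells too small to stay anonymous (assumed)

def aggregate(responses):
    """responses: iterable of (block_id, window, answer) tuples."""
    cells = defaultdict(list)
    for block, window, answer in responses:
        cells[(block, window)].append(VALUE[answer])
    return {
        cell: sum(vals) / len(vals)  # net sentiment in [-1, 1]
        for cell, vals in cells.items()
        if len(vals) >= MIN_RESPONSES
    }

print(aggregate([("A1", "00-04", "Comfortable")] * 5))  # {('A1', '00-04'): 1.0}
```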
Phase 1 validates signal quality. Phase 2 builds a correlation engine — mapping which observable conditions predict perception. Phase 2 also tests interventions (sights: cleaning, lighting, ambassadors; sounds: street musicians; smells: pleasant aromas; civic signals: responsive service). Measure impact in near-real-time. This creates a continuous improvement cycle: measure → correlate → intervene → re-measure.
City Science Lab San Francisco × MIT Media Lab City Science